Minimax rates for conditional density estimation via empirical entropy

Authors

Abstract

We consider the task of estimating a conditional density using i.i.d. samples from a joint distribution, which is a fundamental problem with applications in both classification and uncertainty quantification for regression. For joint density estimation, minimax rates have been characterized for general density classes in terms of uniform (metric) entropy, a well-studied notion of statistical capacity. When applying these results to conditional density estimation, the use of uniform entropy (which is infinite when the covariate space is unbounded and suffers from the curse of dimensionality) can lead to suboptimal rates. Consequently, minimax rates for conditional density estimation cannot be characterized using these classical results. We resolve this problem for well-specified models, obtaining matching (within logarithmic factors) upper and lower bounds on the minimax Kullback-Leibler risk in terms of the empirical Hellinger entropy of the conditional density class. The use of empirical entropy allows us to appeal to concentration arguments based on local Rademacher complexity, which, in contrast to uniform entropy, leads to matching rates for large, potentially nonparametric classes and captures the correct dependence on the complexity of the covariate space. Our results require only that the conditional densities are bounded above, and do not require that they be bounded below or otherwise satisfy any tail conditions.
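
To make the objects in the abstract concrete, here is a schematic statement in notation of our own choosing (the symbols below are assumptions, not quoted from the paper): the minimax Kullback-Leibler risk over a conditional density class F, and the empirical Hellinger distance whose covering numbers define the empirical Hellinger entropy.

% Notation supplied by us, not taken from the paper.
% Minimax KL risk for a conditional density class F, with covariates X.
\[
  \mathfrak{M}_n(\mathcal{F})
    = \inf_{\widehat{f}} \sup_{f^{\ast} \in \mathcal{F}}
      \mathbb{E}\Big[\mathrm{KL}\big(f^{\ast}(\cdot \mid X) \,\|\, \widehat{f}(\cdot \mid X)\big)\Big].
\]
% Empirical (squared) Hellinger distance on the observed covariates x_1, ..., x_n;
% the empirical Hellinger entropy is the metric entropy of F under this distance.
\[
  d_n^{2}(f, g)
    = \frac{1}{n}\sum_{i=1}^{n}
      \int \Big(\sqrt{f(y \mid x_i)} - \sqrt{g(y \mid x_i)}\Big)^{2}\, dy.
\]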

Related articles

Regularized Minimax Conditional Entropy for Crowdsourcing

There is a rapidly increasing interest in crowdsourcing for data labeling. Through crowdsourcing, a large number of labels can often be gathered quickly and at low cost. However, the labels provided by crowdsourcing workers are usually not of high quality. In this paper, we propose a minimax conditional entropy principle to infer ground truth from noisy crowdsourced labels. Under this principle, we ...

Empirical Entropy, Minimax Regret and Minimax Risk

We consider the random design regression model with square loss. We propose a method that aggregates empirical risk minimizers (ERM) over appropriately chosen random subsets and reduces to ERM in the extreme case, and we establish exact oracle inequalities for its risk. We show that, under ε^{-p} growth of the empirical ε-entropy, the excess risk of the proposed method attains the rate n^{-2/(2+p)} for p ∈ (0,...
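
For orientation, the entropy growth condition invoked in this rate statement can be written (again in notation we are supplying, not quoting) as a polynomial bound on the empirical metric entropy of the function class:

% Notation supplied by us: N(eps, F, L_2(P_n)) denotes the eps-covering number of
% the class F under the empirical L_2 distance induced by the design points.
\[
  \log N\big(\varepsilon, \mathcal{F}, L_2(P_n)\big) \;\lesssim\; \varepsilon^{-p},
  \qquad \varepsilon > 0,
\]
% and the excess-risk rate n^{-2/(2+p)} quoted above is the familiar consequence
% of this growth condition for the indicated range of p.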

Conditional Density Estimation with Dimensionality Reduction via Squared-Loss Conditional Entropy Minimization

Regression aims at estimating the conditional mean of output given input. However, regression is not informative enough if the conditional density is multimodal, heteroskedastic, and asymmetric. In such a case, estimating the conditional density itself is preferable, but conditional density estimation (CDE) is challenging in high-dimensional space. A naive approach to coping with high dimension...

Conditional Density Estimation via Least-Squares Density Ratio Estimation

Estimating the conditional mean of an input-output relation is the goal of regression. However, regression analysis is not sufficiently informative if the conditional distribution has multi-modality, is highly asymmetric, or contains heteroscedastic noise. In such scenarios, estimating the conditional distribution itself would be more useful. In this paper, we propose a novel method of condition...
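
As a point of reference for the ratio formulation p(y | x) = p(x, y) / p(x) that this line of work builds on, the following is a minimal sketch of a naive kernel-density baseline for conditional density estimation. It is our own illustration with assumed function names and bandwidths, not the least-squares density-ratio estimator proposed in the paper.

import numpy as np

def gaussian_kde(points, queries, bandwidth):
    """Average of isotropic Gaussian kernels centered at `points`, evaluated at `queries`."""
    d = points.shape[1]
    diffs = queries[:, None, :] - points[None, :, :]            # shape (q, n, d)
    sq = np.sum(diffs ** 2, axis=2) / (2.0 * bandwidth ** 2)    # shape (q, n)
    norm = (2.0 * np.pi * bandwidth ** 2) ** (d / 2.0)
    return np.exp(-sq).sum(axis=1) / (len(points) * norm)

def conditional_density(x_train, y_train, x_query, y_grid, bandwidth=0.3):
    """Naive ratio estimate p_hat(y | x) = p_hat(x, y) / p_hat(x) on a grid of y values."""
    xy_train = np.hstack([x_train, y_train])
    xy_query = np.hstack([np.repeat(x_query[None, :], len(y_grid), axis=0),
                          y_grid.reshape(-1, 1)])
    joint = gaussian_kde(xy_train, xy_query, bandwidth)         # p_hat(x_query, y) for each y
    marginal = gaussian_kde(x_train, x_query[None, :], bandwidth)[0]
    return joint / max(marginal, 1e-12)

# Toy usage: heteroscedastic data whose noise level grows with x.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=(500, 1))
y = np.sin(2.0 * np.pi * x) + (0.1 + 0.3 * x) * rng.standard_normal((500, 1))
y_grid = np.linspace(-2.0, 2.0, 200)
p_y_given_x = conditional_density(x, y, np.array([0.8]), y_grid)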

Minimax Density Estimation for Growing Dimension

This paper presents minimax rates for density estimation when the data dimension d is allowed to grow with the number of observations n rather than remaining fixed as in previous analyses. We prove a non-asymptotic lower bound which gives the worst-case rate over standard classes of smooth densities, and we show that kernel density estimators achieve this rate. We also give oracle choices for t...
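
The claim that kernel density estimators attain the minimax rate hinges on a bandwidth that shrinks with both n and d. Below is a minimal sketch assuming Hölder smoothness beta and the classical choice h ~ n^{-1/(2*beta + d)}; this bandwidth rule and the function names are our illustrative stand-ins, not the oracle tuning discussed in the paper.

import numpy as np

def kde(data, queries, bandwidth):
    """Isotropic Gaussian kernel density estimate of `data`, evaluated at `queries`."""
    n, d = data.shape
    diffs = queries[:, None, :] - data[None, :, :]
    sq = np.sum(diffs ** 2, axis=2) / (2.0 * bandwidth ** 2)
    norm = (2.0 * np.pi * bandwidth ** 2) ** (d / 2.0)
    return np.exp(-sq).sum(axis=1) / (n * norm)

def rate_optimal_bandwidth(n, d, beta=2.0):
    """Classical bandwidth h ~ n^{-1/(2*beta + d)} for beta-smooth densities in dimension d."""
    return n ** (-1.0 / (2.0 * beta + d))

# Toy usage: as d grows with n, the bandwidth shrinks more slowly and the attainable rate degrades.
rng = np.random.default_rng(1)
for n, d in [(1000, 2), (10000, 8)]:
    data = rng.standard_normal((n, d))
    h = rate_optimal_bandwidth(n, d)
    f_hat_at_origin = kde(data, np.zeros((1, d)), h)[0]
    print(f"n={n}, d={d}, h={h:.3f}, f_hat(0)={f_hat_at_origin:.4f}")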

Journal

Journal title: Annals of Statistics

Year: 2023

ISSN: 0090-5364, 2168-8966

DOI: https://doi.org/10.1214/23-aos2270